Similar Resources
Quasi-Newton Methods: A New Direction
Four decades after their invention, quasi-Newton methods are still state of the art in unconstrained numerical optimization. Although not usually interpreted thus, these are learning algorithms that fit a local quadratic approximation to the objective function. We show that many, including the most popular, quasi-Newton methods can be interpreted as approximations of Bayesian linear regression u...
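To make the quadratic-model view concrete, here is a minimal BFGS sketch in Python with NumPy: each step minimizes the local quadratic model m(p) = f(x) + g^T p + 0.5 p^T B p and then updates B from the observed gradient change. The toy objective, the unit step (no line search), and the curvature guard are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def bfgs_step(grad, x, B):
    """One quasi-Newton step: minimize the local quadratic model
    m(p) = f(x) + g^T p + 0.5 p^T B p, then update B with BFGS.
    Illustrative sketch: unit step, no line search."""
    g = grad(x)
    p = -np.linalg.solve(B, g)          # minimizer of the quadratic model
    x_new = x + p
    s = x_new - x
    y = grad(x_new) - g
    if s @ y > 1e-10:                   # skip update if curvature condition fails
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return x_new, B

# usage: minimize a toy quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x, B = np.zeros(2), np.eye(2)
for _ in range(20):
    x, B = bfgs_step(grad, x, B)
print(x, np.linalg.solve(A, b))         # both approach the true minimizer
```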
A quasi-Newton algorithm for large-scale nonlinear equations
In this paper, the algorithm for large-scale nonlinear equations is designed in the following steps: (i) a conjugate gradient (CG) algorithm is designed as a sub-algorithm to obtain the initial points of the main algorithm, where the sub-algorithm's initial point does not have any restrictions; (ii) a quasi-Newton algorithm with the initial points given by the sub-algorithm is defined as the main algor...
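A hedged sketch of such a two-phase scheme, not the paper's exact algorithm: a few first-order steps on the merit function 0.5 * ||F(x)||^2 stand in for the CG sub-algorithm and produce a warm start, and Broyden's quasi-Newton update stands in for the main algorithm. The step sizes, iteration counts, and finite-difference Jacobian are assumptions for illustration.

```python
import numpy as np

def jacobian_fd(F, x, h=1e-6):
    """Finite-difference Jacobian, kept simple for the sketch."""
    n = len(x)
    J = np.zeros((n, n))
    f0 = F(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        J[:, i] = (F(x + e) - f0) / h
    return J

def solve_nonlinear(F, x0, warm_steps=5, qn_steps=50, tol=1e-8):
    """Illustrative two-phase scheme: (i) gradient steps on the merit
    function 0.5*||F(x)||^2 as a warm start (stand-in for the paper's CG
    sub-algorithm); (ii) Broyden's quasi-Newton iteration on F(x) = 0."""
    x = x0.copy()
    for _ in range(warm_steps):
        J = jacobian_fd(F, x)
        x = x - 0.1 * J.T @ F(x)        # gradient step on the merit function
    B = jacobian_fd(F, x)               # initial Jacobian approximation
    for _ in range(qn_steps):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)
        x_new = x + s
        y = F(x_new) - Fx
        B = B + np.outer(y - B @ s, s) / (s @ s)   # Broyden rank-one update
        x = x_new
    return x

# usage: solve x0^2 + x1 - 1 = 0, x0 - x1 = 0
F = lambda x: np.array([x[0]**2 + x[1] - 1, x[0] - x[1]])
print(solve_nonlinear(F, np.array([2.0, 2.0])))
```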
A Quasi-Newton Acceleration of the EM Algorithm
The EM algorithm is one of the most commonly used methods of maximum likelihood estimation. In many practical applications, it converges at a frustratingly slow linear rate. The current paper considers an acceleration of the EM algorithm based on classical quasi-Newton optimization techniques. This acceleration seeks to steer the EM algorithm gradually toward the Newton-Raphson algorithm, which...
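One common way to realize this idea, shown here only as an illustrative stand-in for the paper's scheme: treat EM as a fixed-point map M and apply a Broyden-style quasi-Newton iteration to G(theta) = M(theta) - theta, which steers the slow linear fixed-point iteration toward Newton-like steps. The toy map and the initial Jacobian guess -I are assumptions.

```python
import numpy as np

def qn_accelerated_em(em_map, theta0, iters=50, tol=1e-10):
    """Illustrative quasi-Newton acceleration of a fixed-point (EM) map.
    Treats G(theta) = em_map(theta) - theta as a nonlinear system and applies
    a Broyden-style secant update; a generic stand-in for the paper's scheme,
    not its exact algorithm."""
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    B = -np.eye(theta.size)             # Jacobian guess for G near a fixed point
    G = em_map(theta) - theta
    for _ in range(iters):
        if np.linalg.norm(G) < tol:
            break
        s = np.linalg.solve(B, -G)      # quasi-Newton step on G(theta) = 0
        theta_new = theta + s
        G_new = em_map(theta_new) - theta_new
        y = G_new - G
        B = B + np.outer(y - B @ s, s) / (s @ s)   # Broyden update
        theta, G = theta_new, G_new
    return theta

# usage: a slow linear fixed-point map M(t) = 0.9*t + 0.1 (fixed point t = 1);
# plain iteration needs ~200 steps for 1e-10 accuracy, this version far fewer
M = lambda t: 0.9 * t + 0.1
print(qn_accelerated_em(M, np.array([0.0])))
```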
A Quasi-Newton Adaptive Algorithm for Estimating Generalized Eigenvectors
We first introduce a constrained minimization formulation for the generalized symmetric eigenvalue problem and then recast it into an unconstrained minimization problem by constructing an appropriate cost function. The minimizer of this cost function is the eigenvector associated with the minimum eigenvalue of the given symmetric matrix pencil, and all minimizers are global minimizers. ...
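As a simple stand-in for the paper's cost function (whose exact form the excerpt does not give), the sketch below runs gradient descent on the generalized Rayleigh quotient rho(w) = (w^T A w)/(w^T B w), whose minimizing directions are eigenvectors of the minimum eigenvalue of the pencil (A, B). The step size, iteration count, and per-step normalization are illustrative choices.

```python
import numpy as np

def min_generalized_eigvec(A, B, steps=500, lr=0.1):
    """Gradient descent on the generalized Rayleigh quotient
    rho(w) = (w^T A w)/(w^T B w), a stand-in for the paper's unconstrained
    cost; its minimizing directions are eigenvectors of the minimum
    eigenvalue of the symmetric pencil (A, B)."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal(A.shape[0])
    for _ in range(steps):
        Aw, Bw = A @ w, B @ w
        rho = (w @ Aw) / (w @ Bw)
        grad = 2 * (Aw - rho * Bw) / (w @ Bw)   # gradient of the quotient
        w = w - lr * grad
        w = w / np.linalg.norm(w)               # quotient is scale-invariant
    return rho, w

# usage on a small symmetric pencil
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 2.0]])
rho, w = min_generalized_eigvec(A, B)
print(rho)  # approaches the minimum generalized eigenvalue of (A, B)
```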
A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization
We propose a generic approach to accelerate gradient-based optimization algorithms with quasi-Newton principles. The proposed scheme, called QuickeNing, can be applied to incremental first-order methods such as stochastic variance-reduced gradient (SVRG) or incremental surrogate optimization (MISO). It is also compatible with composite objectives, meaning that it has the ability to provide exact...
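The excerpt does not spell out QuickeNing's construction, but it is built on the Moreau envelope; the sketch below shows that envelope idea under stated assumptions: an inner first-order loop (standing in for SVRG or MISO) approximates the proximal point z*(x), and the outer loop steps along the envelope gradient kappa * (x - z*(x)). QuickeNing applies L-BFGS in the outer loop; here a plain exact envelope step keeps the sketch short, and all constants are illustrative.

```python
import numpy as np

def quickening_sketch(grad_f, x0, kappa=1.0, outer=20, inner=50, lr=0.1):
    """Minimal sketch of the Moreau-envelope idea behind QuickeNing: the
    outer loop moves along the gradient of the smoothed objective
    F(x) = min_z f(z) + kappa/2 * ||z - x||^2, which equals kappa*(x - z*(x));
    the inner loop approximates z*(x) with a first-order method (stand-in
    for SVRG/MISO). Step sizes and counts are illustrative assumptions."""
    x = x0.copy()
    for _ in range(outer):
        z = x.copy()
        for _ in range(inner):          # inner solver for the proximal subproblem
            z = z - lr * (grad_f(z) + kappa * (z - x))
        g_F = kappa * (x - z)           # gradient of the Moreau envelope at x
        x = x - (1.0 / kappa) * g_F     # exact envelope step; QuickeNing uses L-BFGS here
    return x

# usage on a toy quadratic f(x) = 0.5 x^T A x
A = np.diag([10.0, 1.0])
grad_f = lambda x: A @ x
print(quickening_sketch(grad_f, np.array([5.0, 5.0])))  # approaches the minimizer 0
```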
Journal
Journal title: Applied Mathematics Letters
Year: 1988
ISSN: 0893-9659
DOI: 10.1016/0893-9659(88)90186-3